Two-Stage Importance Sampling With Mixture Proposals
Authors
Abstract
In importance sampling (IS), multiple proposals can be combined to address different aspects of a target distribution. Various methods exist for IS with multiple proposals, including Hesterberg's stratified IS estimator, Owen and Zhou's regression estimator, and Tan's maximum likelihood estimator. To allocate samples efficiently across proposals, it is natural to use a pilot sample to select the mixture proportions before the actual sampling and estimation. However, most existing discussions of such a two-stage procedure are empirical. In this article, we establish a theoretical framework for applying the two-stage procedure to these methods, covering the asymptotic properties of the resulting estimators and the choice of the pilot sample size. Our simulation studies show that the two-stage estimators can outperform estimators with naive choices of mixture proportions. Furthermore, while Owen and Zhou's and Tan's estimators were designed for estimating normalizing constants, we extend their use, together with the two-stage procedure, to estimating expectations and show that the improvement is preserved in this extension.
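To make the two-stage idea concrete, the following Python sketch uses a toy one-dimensional example; the target density, integrand, proposals, and allocation rule are placeholders and not the article's procedure. A small pilot sample drawn from the uniform mixture is used to pick mixture proportions (here by minimizing an estimated variance over a random grid on the simplex), and the main mixture-IS estimate is then computed with the selected proportions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy normalized target density p and integrand f; we estimate E_p[f(X)].
def p_pdf(x):
    return 0.5 * stats.norm.pdf(x, -3, 1) + 0.5 * stats.norm.pdf(x, 4, 0.7)

def f(x):
    return x ** 2

# Component proposals q_1, ..., q_J, each covering one region of the target.
proposals = [stats.norm(-3, 1.5), stats.norm(4, 1.0), stats.norm(0, 3.0)]
J = len(proposals)

def mixture_pdf(x, alpha):
    return sum(a * q.pdf(x) for a, q in zip(alpha, proposals))

def mixture_sample(n, alpha):
    alpha = np.asarray(alpha, dtype=float)
    counts = rng.multinomial(n, alpha / alpha.sum())
    return np.concatenate([q.rvs(size=c, random_state=rng)
                           for q, c in zip(proposals, counts)])

# ---- Stage 1: pilot sample from the uniform mixture, then choose alpha ----
n_pilot = 500
alpha_unif = np.full(J, 1.0 / J)
x_pilot = mixture_sample(n_pilot, alpha_unif)

def est_variance(alpha):
    """Estimate Var_{q_alpha}[f p / q_alpha] by reweighting the pilot draws."""
    g = f(x_pilot) * p_pdf(x_pilot) / mixture_pdf(x_pilot, alpha)
    w = mixture_pdf(x_pilot, alpha) / mixture_pdf(x_pilot, alpha_unif)
    return np.mean(w * g ** 2) - np.mean(w * g) ** 2

# Crude search over a random grid on the simplex (illustration only).
candidates = rng.dirichlet(np.ones(J), size=200)
alpha_hat = candidates[np.argmin([est_variance(a) for a in candidates])]

# ---- Stage 2: main sample from the selected mixture ----
n_main = 20_000
x = mixture_sample(n_main, alpha_hat)
mu_hat = np.mean(f(x) * p_pdf(x) / mixture_pdf(x, alpha_hat))

print("selected proportions:", np.round(alpha_hat, 3))
print("two-stage estimate of E_p[f(X)]:", mu_hat)
```

The pilot stage only decides the design: the grid search above stands in for whatever allocation rule one prefers, and the article's analysis concerns, among other things, how large this pilot sample should be relative to the main sample.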
Similar resources
Optimal mixture weights in multiple importance sampling
In multiple importance sampling we combine samples from a finite list of proposal distributions. When those proposal distributions are used to create control variates, it is possible (Owen and Zhou, 2000) to bound the ratio of the resulting variance to that of the unknown best proposal distribution in our list. The minimax regret arises by taking a uniform mixture of proposals, but that is cons...
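As a rough illustration of the control-variate construction mentioned above (with a toy target, integrand, and fixed mixture proportions, all placeholders rather than the cited method), the centered component-density ratios q_j/q_alpha - 1 have known mean zero under the mixture and can be regressed out of the plain mixture-IS terms:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Toy normalized target density and integrand; mu = E_p[f(X)] is estimated.
def p_pdf(x):
    return 0.5 * stats.norm.pdf(x, -3, 1) + 0.5 * stats.norm.pdf(x, 4, 0.7)

def f(x):
    return x ** 2

proposals = [stats.norm(-3, 1.5), stats.norm(4, 1.0), stats.norm(0, 3.0)]
alpha = np.array([0.4, 0.4, 0.2])   # fixed mixture proportions for this demo
n = 10_000

# Sample from the mixture q_alpha.
counts = rng.multinomial(n, alpha)
x = np.concatenate([q.rvs(size=c, random_state=rng)
                    for q, c in zip(proposals, counts)])
q_mix = sum(a * q.pdf(x) for a, q in zip(alpha, proposals))

y = f(x) * p_pdf(x) / q_mix                                       # plain mixture-IS terms
H = np.column_stack([q.pdf(x) / q_mix for q in proposals]) - 1.0  # controls with mean 0

# Control-variate (regression) estimate: the intercept of least squares of y
# on an intercept column plus the centered controls.  lstsq handles the exact
# linear dependence among the controls (they sum to zero with weights alpha).
X = np.column_stack([np.ones(n), H])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

print("plain mixture-IS estimate :", y.mean())
print("control-variate estimate  :", coef[0])
```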
Computing Normalizing Constants for Finite Mixture Models via Incremental Mixture Importance Sampling (IMIS)
This article proposes a method for approximating integrated likelihoods in finite mixture models. We formulate the model in terms of the unobserved group memberships, z, and make them the variables of integration. The integral is then evaluated using importance sampling over the z. We propose an adaptive importance sampling function which is itself a mixture, with two types of component distrib...
Adaptive Mixture Importance Sampling
Importance sampling involves approximation of functionals (such as expectations) of a target distribution by sampling from a design distribution. In many applications, it is natural or convenient to use a design distribution which is a mixture of given distributions. One typically has wide latitude in selecting the mixing probabilities of the design distribution. Furthermore, one can reduce var...
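In that spirit (and only as an illustration, not the specific method of this article), one simple adaptive scheme re-weights the mixing probabilities over a few rounds according to how much normalized importance weight each component's draws carry; a minimal Python sketch with placeholder densities:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Toy normalized target; the component proposals are fixed and only their
# mixing probabilities are adapted.
def p_pdf(x):
    return 0.5 * stats.norm.pdf(x, -3, 1) + 0.5 * stats.norm.pdf(x, 4, 0.7)

proposals = [stats.norm(-6, 2), stats.norm(0, 2), stats.norm(6, 2)]
J = len(proposals)
alpha = np.full(J, 1.0 / J)          # start from the uniform mixture
n_round = 2_000

for t in range(5):
    # Draw from the current mixture, recording which component produced
    # each sample.
    labels = rng.choice(J, size=n_round, p=alpha)
    x = np.array([proposals[j].rvs(random_state=rng) for j in labels])
    q_mix = sum(a * q.pdf(x) for a, q in zip(alpha, proposals))
    w_norm = p_pdf(x) / q_mix
    w_norm /= w_norm.sum()

    # Give each component the total normalized weight of its own draws,
    # with a small floor so no component dies out.
    new_alpha = np.array([w_norm[labels == j].sum() for j in range(J)])
    alpha = 0.95 * new_alpha + 0.05 / J
    alpha /= alpha.sum()
    print(f"round {t}: mixing probabilities = {np.round(alpha, 3)}")
```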
Computing Normalizing Constants for Finite Mixture Models via Incremental Mixture Importance Sampling
We propose a method for approximating integrated likelihoods in finite mixture models. We formulate the model in terms of the unobserved group memberships, z, and make them the variables of integration. The integral is then evaluated using importance sampling over the z. We propose an adaptive importance sampling function which is itself a mixture, with two types of component distributions, one...
Neural Block Sampling
Efficient Monte Carlo inference often requires manual construction of model-specific proposals. We propose an approach to automated proposal construction by training neural networks to provide fast approximations to block Gibbs conditionals. The learned proposals generalize to occurrences of common structural motifs both within a given model and across models, allowing for the construction of a...